TanhExp: A smooth activation function with high convergence speed for lightweight neural networks

Authors

Abstract

Lightweight or mobile neural networks used for real-time computer vision tasks contain fewer parameters than normal networks, which leads to constrained performance. Herein, a novel activation function named the Tanh Exponential Activation Function (TanhExp) is proposed, which can significantly improve the performance of these networks on image classification tasks. TanhExp is defined as f(x) = x tanh(e^x). The simplicity, efficiency, and robustness of TanhExp on various datasets and network models are demonstrated, and it outperforms its counterparts in both convergence speed and accuracy. Its behaviour also remains stable when noise is added or the dataset is altered. It is shown that, without increasing the size of the network, the capacity of lightweight networks can be enhanced by TanhExp with only a few training epochs and no extra parameters added.
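The definition f(x) = x tanh(e^x) from the abstract can be sketched directly in NumPy. This is a minimal illustration, not the authors' released code; the overflow clip on the exponent is our own addition:

```python
import numpy as np

def tanhexp(x):
    """TanhExp activation: f(x) = x * tanh(exp(x)).

    The exponent is clipped to avoid float overflow for large inputs;
    tanh already saturates to 1 well before exp(x) reaches the clip
    value, so the clip does not change the result.
    """
    x = np.asarray(x, dtype=np.float64)
    return x * np.tanh(np.exp(np.minimum(x, 20.0)))
```

Like other smooth self-gated activations, TanhExp passes large positive inputs through almost unchanged (tanh(e^x) → 1) and damps large negative inputs toward zero (tanh(e^x) → 0), while staying differentiable everywhere.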


Related Articles

Lightweight enhanced monitoring for high speed networks

In this paper, LEMON, a lightweight enhanced monitoring algorithm based on packet sampling, is proposed. It targets a pre-assigned accuracy on bitrate estimates, for each monitored flow at a router interface. To this end, LEMON takes into account some basic properties of the flows, which can be easily inferred from a sampled stream, and it exploits them to dynamically adapt the monitoring time-...


On the convergence speed of artificial neural networks in the solving of linear systems

Artificial neural networks have advantages such as learning, adaptation, fault-tolerance, parallelism and generalization. This paper is a scrutiny on the application of diverse learning methods in speed of convergence in neural networks. For this aim, first we introduce a perceptron method based on artificial neural networks which has been applied for solving a non-singula...


Neural Networks with Smooth Adaptive Activation Functions for Regression

In Neural Networks (NN), Adaptive Activation Functions (AAF) have parameters that control the shapes of activation functions. These parameters are trained along with other parameters in the NN. AAFs have improved performance of Neural Networks (NN) in multiple classification tasks. In this paper, we propose and apply AAFs on feedforward NNs for regression tasks. We argue that applying AAFs in t...


Convergence Analysis of Two-layer Neural Networks with ReLU Activation

In recent years, stochastic gradient descent (SGD) based techniques have become the standard tools for training neural networks. However, a formal theoretical understanding of why SGD can train neural networks in practice is largely missing. In this paper, we make progress on understanding this mystery by providing a convergence analysis for SGD on a rich subset of two-layer feedforward networks w...


Binary Output of Cellular Neural Networks with Smooth Activation

An important property of cellular neural networks (CNN's) is the binary output property, namely that when the self-feedback is greater than one, the final activations are 1. This brief considers the generalization of this property to networks with sigmoidal output functions. It is shown that in this case the property cannot be stated without reference to the cross feedback, and conditions are found u...



Journal

Journal title: IET Computer Vision

سال: 2021

ISSN: 1751-9632, 1751-9640

DOI: https://doi.org/10.1049/cvi2.12020